Convergence of Restarted Krylov Subspaces to Invariant Subspaces

Authors

  • Christopher A. Beattie
  • Mark Embree
  • John Rossi
Abstract

The performance of Krylov subspace eigenvalue algorithms for large matrices can be measured by the angle between a desired invariant subspace and the Krylov subspace. We develop general bounds for this convergence that include the effects of polynomial restarting and impose no restrictions concerning the diagonalizability of the matrix or its degree of nonnormality. Associated with a desired set of eigenvalues is a maximum “reachable invariant subspace” that can be developed from the given starting vector. Convergence for this distinguished subspace is bounded in terms involving a polynomial approximation problem. Elementary results from potential theory lead to convergence rate estimates and suggest restarting strategies based on optimal approximation points (e.g., Leja or Chebyshev points); exact shifts are evaluated within this framework. Computational examples illustrate the utility of these results. Origins of superlinear effects are also described.
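As a rough illustration of the quantity the abstract bounds, the sketch below (Python/NumPy, not the authors' code) builds a Krylov subspace for a diagonal test matrix, restarts with a polynomial whose roots are the unwanted Ritz values ("exact shifts"), and tracks the largest principal angle between the restarted Krylov subspace and the invariant subspace of the three dominant eigenvalues. The test matrix, dimensions, and number of restart cycles are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import qr, subspace_angles, eigvals

def krylov_basis(A, v, m):
    """Orthonormal basis of the Krylov subspace K_m(A, v)."""
    K = np.empty((A.shape[0], m))
    K[:, 0] = v / np.linalg.norm(v)
    for j in range(1, m):
        K[:, j] = A @ K[:, j - 1]
    Q, _ = qr(K, mode='economic')
    return Q

rng = np.random.default_rng(0)
n, m, k = 300, 12, 3                      # problem size, Krylov dimension, wanted eigenvalues
A = np.diag(np.r_[[10.0, 9.0, 8.0], rng.uniform(0.0, 1.0, n - k)])
U = np.eye(n)[:, :k]                      # desired invariant subspace (dominant eigenvectors)
v = rng.standard_normal(n)

for cycle in range(5):
    Q = krylov_basis(A, v, m)
    ritz = eigvals(Q.T @ A @ Q)           # Ritz values of the projected matrix
    shifts = np.sort(ritz.real)[:m - k]   # "exact shifts": the unwanted Ritz values
    theta = subspace_angles(U, Q).max()   # largest principal angle, in radians
    print(f"cycle {cycle}: sin(largest angle) = {np.sin(theta):.2e}")
    for s in shifts:                      # restart with p(A) v, where p has roots at the shifts
        v = A @ v - s * v
    v /= np.linalg.norm(v)
```

The printed angles shrink from cycle to cycle because the restart polynomial damps the starting vector's components along the unwanted eigenvectors, which is the mechanism the paper's bounds quantify.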


Similar articles

Locally Optimal and Heavy Ball GMRES Methods

The Generalized Minimal Residual method (GMRES) seeks optimal approximate solutions of linear system Ax = b from Krylov subspaces by minimizing the residual norm ‖Ax − b‖2 over all x in the subspaces. Its main cost is computing and storing basis vectors of the subspaces. For difficult systems, Krylov subspaces of very high dimensions are necessary for obtaining approximate solutions with desire...
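For context, a minimal SciPy example of restarted GMRES is sketched below; the `restart` argument caps the Krylov subspace dimension, which is exactly the basis-storage cost this snippet refers to. The tridiagonal test system is an assumption, not taken from the cited paper.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import gmres

n = 1000
A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format='csr')  # 1-D Laplacian
b = np.ones(n)

# Keep at most 30 basis vectors per cycle: restarting trades convergence speed
# for bounded memory, the trade-off discussed above.
x, info = gmres(A, b, restart=30, maxiter=200)
print("converged" if info == 0 else f"stopped with info={info}",
      "relative residual:", np.linalg.norm(b - A @ x) / np.linalg.norm(b))
```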


Convergence Analysis of Restarted Krylov Subspace Eigensolvers

The A-gradient minimization of the Rayleigh quotient allows one to construct robust and fast-convergent eigensolvers for the generalized eigenvalue problem for (A,M) with symmetric and positive definite matrices. The A-gradient steepest descent iteration is the simplest case of more general restarted Krylov subspace iterations for the special case that all step-wise generated Krylov subspaces are tw...
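A hedged sketch of a plain-gradient variant of the iteration described above is given below (assumed test matrices, not the cited paper's setup): each step performs a Rayleigh-Ritz projection of the pencil (A, M) onto the two-dimensional subspace span{x, r}, where r = Ax − ρ(x)Mx is the Euclidean gradient direction of the Rayleigh quotient up to scaling; the A-gradient method proper would use A⁻¹r instead.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(1)
n = 100
A = np.diag(np.arange(1.0, n + 1.0))            # SPD test matrices (assumptions)
M = np.eye(n) + 0.1 * np.diag(rng.uniform(size=n))

x = rng.standard_normal(n)
for it in range(50):
    rho = (x @ A @ x) / (x @ M @ x)             # Rayleigh quotient
    r = A @ x - rho * (M @ x)                   # gradient direction (up to scaling)
    V = np.linalg.qr(np.column_stack([x, r]))[0]
    # Rayleigh-Ritz on span{x, r}: smallest Ritz pair of the projected pencil
    w, Y = eigh(V.T @ A @ V, V.T @ M @ V)
    x = V @ Y[:, 0]
print("smallest Ritz value after 50 steps:", (x @ A @ x) / (x @ M @ x))
```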


Sharp Ritz Value Estimates for Restarted Krylov Subspace Iterations

Gradient iterations for the Rayleigh quotient are elemental methods for computing the smallest eigenvalues of a pair of symmetric and positive definite matrices. A considerable convergence acceleration can be achieved by preconditioning and by computing Rayleigh-Ritz approximations from subspaces of increasing dimensions. An example of the resulting Krylov subspace eigensolvers is the generaliz...
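The SciPy routine below is one readily available example of a preconditioned, Rayleigh-Ritz-based subspace eigensolver of the kind discussed above (LOBPCG); the pencil, Jacobi preconditioner, and tolerances are assumptions for illustration, not the cited paper's experiments.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import lobpcg

n = 200
A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format='csr')       # stiffness-like matrix
B = diags([1.0, 4.0, 1.0], [-1, 0, 1], shape=(n, n), format='csr') / 6.0   # mass-like matrix
M = diags(1.0 / A.diagonal())                                              # Jacobi preconditioner

rng = np.random.default_rng(2)
X = rng.standard_normal((n, 3))        # block of three starting vectors
vals, vecs = lobpcg(A, X, B=B, M=M, largest=False, tol=1e-5, maxiter=500)
print("three smallest Ritz values:", np.sort(vals))
```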


Weak*-closed invariant subspaces and ideals of semigroup algebras on foundation semigroups

Let S be a locally compact foundation semigroup with identity and … be its semigroup algebra. Let X be a weak*-closed left translation invariant subspace of …. In this paper, we prove that X is invariantly complemented in … if and only if the left ideal … of … has a bounded approximate identity. We also prove that a foundation semigroup with identity S is left amenab...


On a selective reuse of Krylov subspaces in Newton-Krylov approaches for nonlinear elasticity

1. Introduction. We consider the resolution of large-scale nonlinear problems arising from the finite-element discretization of geometrically non-linear structural analysis problems. We use a classical Newton-Raphson algorithm to handle the non-linearity, which leads to the resolution of a sequence of linear systems with non-invariant matrices and right-hand sides. The linear systems are solved ...
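A generic Newton-Krylov sketch is given below for a toy 1-D problem (an assumption, unrelated to the elasticity application above): each Newton step solves its linear system with GMRES using matrix-free finite-difference Jacobian-vector products, with no reuse of Krylov subspaces between steps.

```python
import numpy as np
from scipy.sparse.linalg import gmres, LinearOperator

def F(u):
    """Toy nonlinear system: finite differences for u'' = exp(u) with zero boundary values."""
    n = u.size
    h = 1.0 / (n + 1)
    upad = np.r_[0.0, u, 0.0]
    return (upad[:-2] - 2.0 * upad[1:-1] + upad[2:]) / h**2 - np.exp(u)

n = 50
u = np.zeros(n)
for it in range(10):
    r = F(u)
    if np.linalg.norm(r) < 1e-10:
        break
    # Matrix-free Jacobian-vector products by finite differences (the Newton-Krylov idea).
    eps = 1e-7
    J = LinearOperator((n, n), matvec=lambda v, u=u, r=r: (F(u + eps * v) - r) / eps)
    du, info = gmres(J, -r, restart=20, maxiter=200)   # inner Krylov (GMRES) solve
    u = u + du
print("Newton steps taken:", it, " final residual norm:", np.linalg.norm(F(u)))
```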



Journal:
  • SIAM Journal on Matrix Analysis and Applications

Volume 25, Issue

Pages -

Publication year: 2004